Kinetic samplers for neural quantum states

Authors

Abstract

Neural quantum states (NQS) are a novel class of variational many-body wave functions that are highly flexible in approximating diverse states. Optimizing an NQS ansatz requires sampling from the probability distribution defined by the squared wave-function amplitude. For this purpose we propose to use kinetic protocols and demonstrate that in many important cases such methods lead to much smaller autocorrelation times than the Metropolis-Hastings algorithm, while still allowing lattice symmetries to be implemented easily (unlike autoregressive models). We also use Uniform Manifold Approximation and Projection (UMAP) to construct a two-dimensional isometric embedding of the Markov chains and show that it helps to attain more homogeneous ergodic coverage of the Hilbert-space basis.
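To illustrate the general idea of kinetic (rejection-free) sampling from p(s) ∝ |ψ(s)|², the following is a minimal sketch of a continuous-time Monte Carlo sampler over spin configurations. The toy log-amplitude log_psi and the flip-rate choice min(1, p(s')/p(s)) are illustrative assumptions, not necessarily the protocols studied in the paper.

import numpy as np

rng = np.random.default_rng(0)
n_spins = 10

def log_psi(s):
    # Toy log-amplitude standing in for a trained NQS (illustrative only).
    return 0.3 * np.sum(s[:-1] * s[1:])

def kinetic_sample(n_steps):
    s = rng.choice([-1, 1], size=n_spins)
    configs, weights = [], []
    for _ in range(n_steps):
        # Rates of all single-spin flips, w_i = min(1, p(s_i) / p(s)),
        # with p(s) = |psi(s)|^2 = exp(2 * log_psi(s)).
        rates = np.empty(n_spins)
        for i in range(n_spins):
            s_flip = s.copy()
            s_flip[i] *= -1
            rates[i] = min(1.0, np.exp(2.0 * (log_psi(s_flip) - log_psi(s))))
        total = rates.sum()
        # The current configuration is weighted by its mean residence time 1/total.
        configs.append(s.copy())
        weights.append(1.0 / total)
        # Jump to a neighbouring configuration chosen with probability proportional to its rate.
        i = rng.choice(n_spins, p=rates / total)
        s[i] *= -1
    return np.array(configs), np.array(weights)

configs, weights = kinetic_sample(2000)
# Residence-time-weighted estimate of the magnetization under p(s).
print("magnetization:", np.sum(weights * configs.mean(axis=1)) / weights.sum())

Because every step moves to a new configuration and the statistical weight is carried by the residence time, a sampler of this kind avoids the rejected moves that inflate autocorrelation times in Metropolis-Hastings.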


Similar articles

Kinetic theory for quantum plasmas

Quantum plasmas have been studied theoretically for more than four decades. Important early applications include the electron gas in metals and the electron-hole plasma in semiconductors and dimensionally reduced nanostructures. Experiments with lasers and ion beams are now making dense quantum plasmas of electrons and (classical) ions accessible in the laboratory, giving rise to increased theor...


Kinetic theory for quantum nanosystems

Thesis presented in view of obtaining the academic degree of Doctor of Sciences, carried out under the supervision of


Quantum Networks for Generating Arbitrary Quantum States

Many results in quantum information theory require the generation of specific quantum states, such as EPR pairs, or the implementation of specific quantum measurements, such as a von Neumann measurement in a Fourier transformed basis. Some states and measurements can be efficiently implemented using standard quantum computational primitives such as preparing a qubit in the state |0⟩ and applyin...
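As a concrete instance of the standard primitives mentioned above, the short sketch below builds an EPR (Bell) pair from |00⟩ with a Hadamard followed by a CNOT, using plain matrices; the numpy state-vector representation is an illustrative choice, not the construction of the cited work.

import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # CNOT with qubit 0 as control
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

ket00 = np.zeros(4)
ket00[0] = 1.0                                  # |00>, prepared from |0> primitives
bell = CNOT @ np.kron(H, I) @ ket00
print(bell)                                     # (|00> + |11>)/sqrt(2) = [0.707, 0, 0, 0.707]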


f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization

Generative neural samplers are probabilistic models that implement sampling using feedforward neural networks: they take a random input vector and produce a sample from a probability distribution defined by the network weights. These models are expressive and allow efficient computation of samples and derivatives, but cannot be used for computing likelihoods or for marginalization. The generati...
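A generative neural sampler of the kind described above can be sketched in a few lines: a feedforward map from a random input vector to a sample. The layer sizes, tanh nonlinearity, and Gaussian latent below are illustrative assumptions, and the untrained weights stand in for parameters that f-GAN would fit by variational divergence minimization.

import numpy as np

rng = np.random.default_rng(0)
latent_dim, hidden_dim, data_dim = 8, 32, 2

# Untrained weights; f-GAN training would adjust these by minimizing an
# estimated f-divergence, which is beyond this sketch.
W1 = rng.normal(scale=0.5, size=(latent_dim, hidden_dim))
W2 = rng.normal(scale=0.5, size=(hidden_dim, data_dim))

def sample(n):
    z = rng.normal(size=(n, latent_dim))   # random input vectors
    h = np.tanh(z @ W1)                    # feedforward hidden layer
    return h @ W2                          # samples defined by the network weights

print(sample(3))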



Journal

Journal: Physical Review B

Year: 2021

ISSN: 1098-0121, 1550-235X, 1538-4489

DOI: https://doi.org/10.1103/physrevb.104.104407